Ensemble averaging

In machine learning, particularly in the creation of artificial neural networks, ensemble averaging is the process of creating multiple models and combining them to produce a desired output, as opposed to creating just one model. Frequently an ensemble of models performs better than any individual model, because the various errors of the models "average out."
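As a minimal sketch of the idea, assuming a toy regression task (recovering the slope of a noisy line; the data, model form, and helper names here are illustrative, not from the article): each member model is fit on its own noisy view of the data, and the ensemble's prediction is simply the mean of the members' predictions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical task: recover y = 2x from noisy observations.
x = np.linspace(0.0, 1.0, 50)
y_true = 2.0 * x

def fit_noisy_model(rng):
    """Fit a one-parameter least-squares model on its own noisy sample."""
    y_noisy = y_true + rng.normal(0.0, 0.5, size=x.shape)
    slope = np.sum(x * y_noisy) / np.sum(x * x)
    return lambda q: slope * q

# Create multiple models instead of just one...
models = [fit_noisy_model(rng) for _ in range(10)]

# ...and combine them: the ensemble output is the members' average.
def ensemble_predict(q):
    return np.mean([m(q) for m in models], axis=0)
```

Each individual slope estimate is thrown off by its own noise sample, but because those errors are independent they tend to cancel in the average, which is the "average out" effect described above.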
== Overview ==
Ensemble averaging is one of the simplest types of committee machines. Along with boosting, it is one of the two major types of static committee machines.〔Haykin, Simon. Neural networks : a comprehensive foundation. 2nd ed. Upper Saddle River N.J.: Prentice Hall, 1999.〕 In contrast to standard network design in which many networks are generated but only one is kept, ensemble averaging keeps the less satisfactory networks around, but with less weight.〔Hashem, S. "Optimal linear combinations of neural networks." Neural Networks 10, no. 4 (1997): 599–614.〕 The theory of ensemble averaging relies on two properties of artificial neural networks:〔Naftaly, U., N. Intrator, and D. Horn. "Optimal ensemble averaging of neural networks." Network: Computation in Neural Systems 8, no. 3 (1997): 283–296.〕
# In any network, the bias can be reduced at the cost of increased variance
# In a group of networks, the variance can be reduced at no cost to bias
Ensemble averaging creates a group of networks, each with low bias and high variance, and combines them into a new network with (hopefully) low bias and low variance. It thus offers a resolution of the bias-variance dilemma.〔Geman, S., E. Bienenstock, and R. Doursat. "Neural networks and the bias/variance dilemma." Neural computation 4, no. 1 (1992): 1–58.〕 The idea of combining experts has been traced back to Pierre-Simon Laplace.〔Clemen, R. T. "Combining forecasts: A review and annotated bibliography." International Journal of Forecasting 5, no. 4 (1989): 559–583.〕
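The second property above can be checked numerically. In this sketch (a simplified stand-in: each "network" is modeled as an unbiased estimator of a target value plus independent noise, which is an assumption, not the article's setup), averaging N members shrinks the variance roughly by a factor of N while the bias stays near zero.

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for a group of networks: unbiased, high-variance estimators.
target = 1.0
n_members, n_trials = 25, 2000

# Each row is one trial; each column is one ensemble member's estimate.
estimates = target + rng.normal(0.0, 1.0, size=(n_trials, n_members))

# A single member has variance ~1; the 25-member average has variance ~1/25.
single_var = estimates[:, 0].var()
ensemble_var = estimates.mean(axis=1).var()
```

With correlated member errors the reduction is weaker than 1/N, which is why diverse (high-variance, independently trained) members are preferred.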

Excerpt source: the free encyclopedia Wikipedia.
Read the full "Ensemble averaging" article on Wikipedia.



Copyright(C) kotoba.ne.jp 1997-2016. All Rights Reserved.